This study proposes a music-aided framework for affective interaction between service robots and humans. The framework consists of three systems for perception, memory, and expression, modeled on the mechanisms of the human brain. We propose a novel approach to identifying human emotions in the perception system. Conventional approaches use speech and facial expressions as the representative bimodal indicators for emotion recognition. In contrast, our approach uses the mood of music as a supplementary indicator, alongside speech and facial expressions, to determine emotions more accurately. For multimodal emotion recognition, we propose an effective decision criterion that draws on records of bimodal recognition results associated with the musical mood. The memory and expression systems also utilize musical data to provide natural and affective reactions to human emotions. To evaluate our approach, we simulated the proposed human-robot interaction with a service robot, iRobiQ. Our perception system exhibited superior performance over the conventional approach, and most human participants reported favorable reactions to the music-aided affective interaction.
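To illustrate the general idea of using musical mood as a supplementary indicator, the following minimal sketch fuses bimodal (speech and face) class probabilities with a mood-derived prior. The emotion labels, mood-to-emotion prior, and weighting scheme are hypothetical assumptions for illustration only and do not reproduce the paper's actual decision criterion.

```python
# Illustrative sketch: music-aided multimodal emotion fusion.
# Labels, priors, and weights are hypothetical assumptions, not the
# paper's decision criterion.
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]

# Hypothetical prior linking a detected musical mood to emotion classes.
MOOD_PRIOR = {
    "cheerful":    np.array([0.5, 0.1, 0.1, 0.3]),
    "melancholic": np.array([0.1, 0.5, 0.1, 0.3]),
    "aggressive":  np.array([0.1, 0.1, 0.5, 0.3]),
    "calm":        np.array([0.2, 0.1, 0.1, 0.6]),
}

def fuse_emotions(speech_probs, face_probs, music_mood, music_weight=0.3):
    """Combine bimodal (speech + face) scores with a music-mood prior.

    speech_probs, face_probs: probability vectors over EMOTIONS.
    music_mood: key into MOOD_PRIOR, used as a supplementary indicator.
    """
    bimodal = 0.5 * np.asarray(speech_probs) + 0.5 * np.asarray(face_probs)
    prior = MOOD_PRIOR.get(music_mood,
                           np.full(len(EMOTIONS), 1.0 / len(EMOTIONS)))
    fused = (1 - music_weight) * bimodal + music_weight * prior
    fused /= fused.sum()  # renormalize to a probability distribution
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: speech and face are ambiguous; the musical mood tips the decision.
label, scores = fuse_emotions([0.40, 0.40, 0.10, 0.10],
                              [0.35, 0.45, 0.10, 0.10],
                              music_mood="cheerful")
print(label, scores)
```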